
    State of the Art on Neural Rendering

    Efficient rendering of photo-realistic virtual worlds is a long-standing effort of computer graphics. Modern graphics techniques have succeeded in synthesizing photo-realistic images from hand-crafted scene representations. However, the automatic generation of shape, materials, lighting, and other aspects of scenes remains a challenging problem that, if solved, would make photo-realistic computer graphics more widely accessible. Concurrently, progress in computer vision and machine learning has given rise to a new approach to image synthesis and editing, namely deep generative models. Neural rendering is a new and rapidly emerging field that combines generative machine learning techniques with physical knowledge from computer graphics, e.g., by the integration of differentiable rendering into network training. With a plethora of applications in computer graphics and vision, neural rendering is poised to become a new area in the graphics community, yet no survey of this emerging field exists. This state-of-the-art report summarizes the recent trends and applications of neural rendering. We focus on approaches that combine classic computer graphics techniques with deep generative models to obtain controllable and photo-realistic outputs. Starting with an overview of the underlying computer graphics and machine learning concepts, we discuss critical aspects of neural rendering approaches. The report focuses on the many important use cases for the described algorithms, such as novel view synthesis, semantic photo manipulation, facial and body reenactment, relighting, free-viewpoint video, and the creation of photo-realistic avatars for virtual and augmented reality telepresence. Finally, we conclude with a discussion of the social implications of such technology and investigate open research problems.
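    To make the core idea concrete, the following minimal Python sketch treats a rendering step as a differentiable function and optimises a scene parameter (the per-pixel albedo of a flat, Lambertian patch) against a target image by gradient descent. The shading model, array sizes and learning rate are invented for illustration and are not taken from the report.

    # A toy "differentiable renderer": Lambertian shading of a flat patch,
    # written so that the gradient of an image loss with respect to the scene
    # parameter (per-pixel albedo) has a closed form and can drive gradient
    # descent. Purely illustrative; real neural-rendering pipelines use
    # automatic differentiation and far richer scene representations.
    import numpy as np

    rng = np.random.default_rng(0)

    H, W = 8, 8
    normals = np.zeros((H, W, 3))
    normals[..., 2] = 1.0                                  # flat patch facing +z
    light = np.array([0.3, 0.3, 0.9])
    light /= np.linalg.norm(light)
    ndotl = np.clip(normals @ light, 0.0, None)            # (H, W), fixed geometry

    def render(albedo):
        """Differentiable forward model: per-pixel Lambertian shading."""
        return albedo * ndotl

    target = render(rng.uniform(0.2, 0.9, size=(H, W)))    # "photo" to reproduce
    albedo = np.full((H, W), 0.5)                          # initial scene guess

    lr = 10.0
    for step in range(200):
        residual = render(albedo) - target
        loss = np.mean(residual ** 2)
        # Gradient of the loss propagated through the renderer:
        # d loss / d albedo = 2 * residual * (n . l) / N
        grad = 2.0 * residual * ndotl / residual.size
        albedo -= lr * grad

    print(f"final image loss: {loss:.2e}")

    Swapping the analytic gradient for an autodiff framework and the toy shading model for a full renderer yields the "differentiable rendering inside network training" setting that the report surveys.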

    Challenges in QCD matter physics - The Compressed Baryonic Matter experiment at FAIR

    Substantial experimental and theoretical efforts worldwide are devoted to exploring the phase diagram of strongly interacting matter. At LHC and top RHIC energies, QCD matter is studied at very high temperatures and nearly vanishing net-baryon densities. There is evidence that a Quark-Gluon Plasma (QGP) was created in experiments at RHIC and LHC. The transition from the QGP back to the hadron gas is found to be a smooth crossover. For larger net-baryon densities and lower temperatures, it is expected that the QCD phase diagram exhibits a rich structure, such as a first-order phase transition between hadronic and partonic matter which terminates in a critical point, or exotic phases like quarkyonic matter. The discovery of these landmarks would be a breakthrough in our understanding of the strong interaction and is therefore a focus of various high-energy heavy-ion research programmes. The Compressed Baryonic Matter (CBM) experiment at FAIR will play a unique role in the exploration of the QCD phase diagram in the region of high net-baryon densities, because it is designed to run at unprecedented interaction rates. High-rate operation is the key prerequisite for high-precision measurements of multi-differential observables and of rare diagnostic probes which are sensitive to the dense phase of the nuclear fireball. The goal of the CBM experiment at SIS100 (sqrt(s_NN) = 2.7 - 4.9 GeV) is to discover fundamental properties of QCD matter: the phase structure at large baryon-chemical potentials (mu_B > 500 MeV), effects of chiral symmetry, and the equation of state at high density as it is expected to occur in the core of neutron stars. In this article, we review the motivation for and the physics programme of CBM, including activities before the start of data taking in 2022, in the context of the worldwide efforts to explore high-density QCD matter. (15 pages, 11 figures; published in the European Physical Journal.)

    The learning styles neuromyth: when the same term means different things to different teachers

    Alexia Barrable (ORCID: https://orcid.org/0000-0002-5352-8330)
    Although learning styles (LS) have been recognised as a neuromyth, they remain a virtual truism within education. A point of concern is that the term LS has been used within theories that describe them using completely different notions and categorisations. This is the first empirical study to investigate education professionals’ conceptualisation, as well as means of identifying and implementing LS in their classroom. A sample of 123 education professionals were administered a questionnaire consisting of both closed- and open-ended questions. Responses were analysed using thematic analysis. LS were found to be mainly conceptualised within the Visual-Auditory-(Reading)-Kinaesthetic (VAK/VARK) framework, as well as Gardner’s multiple intelligences. Moreover, many education professionals conflated theories of learning (e.g., behavioural or cognitive theories) with LS. In terms of identifying LS, educators reported using a variety of methods, ranging from observation and everyday contact to the use of tests. The ways LS were implemented in the classroom were numerous, comprising various teaching aids, participatory techniques and motor activities. Overall, we argue that the extended use of the term LS gives the illusion of a consensus amongst educators, when a closer examination reveals that the term LS is conceptualised, identified and implemented idiosyncratically by different individuals. This study aims to be of use to pre-service and in-service teacher educators in their effort to debunk the neuromyth of LS and replace it with evidence-based practices. https://doi.org/10.1007/s10212-020-00485-2

    The serious games ecosystem: Interdisciplinary and intercontextual praxis

    This chapter will situate academia in relation to the commercial production and contextual adoption of serious games, and vice versa. As a researcher, it is critical to recognize that academic research on serious games does not occur in a vacuum. Direct partnerships between universities and commercial organizations are increasingly common, as are partnerships between research institutes and the contexts in which their serious games are deployed. Commercial production of serious games and their increased adoption in non-commercial contexts will influence academic research through emerging impact pathways and funding opportunities. Adding further complexity is the emergence of commercial organizations that undertake their own research, and of research institutes that have in-house commercial arms. To conclude, we explore how these issues affect the individual researcher, and offer considerations for future academic and industry serious games projects.

    MOA: A hybrid multidimensional data model for interactive data exploration based on object-relational analysis units

    Our approach to modelling and representing multidimensional data for explorative data analysis [16] is guided by our experience in the context of epidemiological cancer research. The project CARLOS (Cancer Registry Lower-Saxony) developed the Epidemiological and Statistical Data Exploration System (CARESS) to support multidimensional analysis of health data. The system is based on an architecture that focuses on extensive interoperability between a database management system and several analysis and visualisation tools. To meet the requirements, we developed the concepts of static data spaces and dynamic analysis units as abstract data types, the so-called MOA (Model for OLAP Applications). We integrated MOA into an extensible, object-relational database system to provide a platform for the development of advanced support for OLAP applications.
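    As an illustration only, the following Python/pandas sketch shows what a "dynamic analysis unit" over a static data space could look like: a small abstract data type bundling dimensions, a measure and an aggregation function, so that an exploration tool can re-slice the same data space interactively. The class, column names and example data are hypothetical and do not reproduce the MOA interface.

    # Hypothetical sketch of a "dynamic analysis unit" over a static data
    # space: an abstract data type bundling dimensions, a measure and an
    # aggregation, so an exploration tool can re-slice the same data space
    # interactively. Class, column names and data are illustrative only.
    from dataclasses import dataclass

    import pandas as pd

    # Static data space: one row per (region, year, age_group) cell.
    data_space = pd.DataFrame({
        "region":    ["A", "A", "B", "B", "A", "B"],
        "year":      [2020, 2021, 2020, 2021, 2021, 2021],
        "age_group": ["0-39", "40+", "0-39", "40+", "0-39", "0-39"],
        "cases":     [12, 30, 9, 25, 14, 11],
    })

    @dataclass
    class AnalysisUnit:
        """A reusable aggregation over a data space."""
        dimensions: list[str]      # grouping attributes, e.g. ["region", "year"]
        measure: str               # fact to aggregate, e.g. "cases"
        aggregate: str = "sum"     # aggregation function understood by pandas

        def evaluate(self, space: pd.DataFrame) -> pd.DataFrame:
            return (space.groupby(self.dimensions)[self.measure]
                         .agg(self.aggregate)
                         .reset_index())

    # Interactive exploration: different units over the same data space.
    print(AnalysisUnit(["region", "year"], "cases").evaluate(data_space))
    print(AnalysisUnit(["age_group"], "cases", "mean").evaluate(data_space))

    In this sketch, re-parameterising a unit along other dimensions, measures or aggregations leaves the stored data space untouched; that separation is what treating analysis units as first-class objects is meant to buy.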

    Hope for a Cure Through Earlier Detection of Hepatocellular Cancer


    A Spatial Data Cube Concept to Support Data Analysis in Environmental Epidemiology

    The project CARLOS (Cancer Registry Lower-Saxony) developed the Epidemiological and Statistical Data Exploration System (CARESS) to support multidimensional analysis of health data. The system is based on an architecture that focuses on extensive interoperability between a database management system and several analysis and visualisation tools. As spatial and statistical aspects of the data play an important role, CARESS provides special support for the integration of both.
    1. Introduction. CARLOS aims at providing software support for all important aspects of cancer registration: data collection, data security and environmental epidemiological research. The latter is based on exploratory analysis of cancer data [10, 7], where the objective is to detect possible health risks by analysing the spatio-temporal distribution of health data. For example, in order to detect an increased cancer rate around a nuclear power station, geographic information is needed. In general, health d…
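    As an illustration of the kind of spatio-temporal aggregation the example above describes, the following Python/pandas sketch groups case counts into distance bands around a hypothetical site and by year. All coordinates, column names, band edges and counts are invented for the sketch and are not taken from CARESS.

    # Illustrative spatio-temporal aggregation in the spirit of the example
    # above: counting cases by distance band around a site (e.g. a power
    # station) and by year. Coordinates, column names, band edges and counts
    # are invented; a real spatial data cube would take its geography from a
    # GIS layer.
    import numpy as np
    import pandas as pd

    site_lat, site_lon = 52.50, 9.80       # hypothetical site (degrees)

    cases = pd.DataFrame({
        "lat":  [52.51, 52.48, 52.60, 52.40, 52.52, 52.70],
        "lon":  [9.81,  9.79,  9.95,  9.60,  9.82, 10.10],
        "year": [2019,  2019,  2020,  2020,  2021,  2021],
        "n":    [3,     1,     2,     4,     1,     5],
    })

    # Approximate distance to the site in km (equirectangular approximation,
    # adequate over small regions).
    dlat = np.radians(cases["lat"] - site_lat)
    dlon = np.radians(cases["lon"] - site_lon) * np.cos(np.radians(site_lat))
    cases["dist_km"] = 6371.0 * np.sqrt(dlat**2 + dlon**2)

    # Spatial dimension of the cube: distance bands around the site.
    cases["band"] = pd.cut(cases["dist_km"], bins=[0, 5, 15, 50],
                           labels=["<5 km", "5-15 km", "15-50 km"])

    # Cube cells: case counts per (distance band, year).
    cube = cases.pivot_table(index="band", columns="year", values="n",
                             aggfunc="sum", fill_value=0, observed=False)
    print(cube)

    In a data-cube setting, the distance band and the year would simply be two dimensions of the cube, with the case count as the measure.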